Audit finds gender and age bias in OpenAI's CLIP model
In January, OpenAI released Contrastive Language-Image Pre-training (CLIP), an AI model trained to recognize a range of visual concepts in images and associate them with their names. CLIP performs quite well on classification tasks -- for instance, it can caption an image of a dog as "a photo of a dog." But according to an OpenAI audit conducted with Jack Clark, OpenAI's former policy director, CLIP is susceptible to biases that could have implications for people who use -- and interact with -- the model. Prejudices often make their way into the data used to train AI systems, amplifying stereotypes and leading to harmful consequences.
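CLIP's zero-shot classification works by embedding the image and a set of candidate captions into a shared space, then picking the caption whose embedding is most similar to the image embedding. A toy NumPy sketch of that scoring step (the encoders here are stand-in random projections, not CLIP's real vision and text networks, and the temperature is fixed rather than learned):

```python
import numpy as np

rng = np.random.default_rng(0)

def embed(features, proj):
    """Stand-in encoder: project input features and L2-normalize,
    mirroring how CLIP normalizes its image/text embeddings."""
    v = features @ proj
    return v / np.linalg.norm(v)

# Stand-in "encoders": random projections into a shared 8-dim space.
image_proj = rng.normal(size=(16, 8))
text_proj = rng.normal(size=(12, 8))

image_features = rng.normal(size=16)          # pretend image features
captions = ["a photo of a dog", "a photo of a cat", "a photo of a car"]
caption_features = rng.normal(size=(3, 12))   # pretend text features

img_emb = embed(image_features, image_proj)
txt_embs = np.stack([embed(c, text_proj) for c in caption_features])

# Score each caption by cosine similarity with the image embedding
# (scaled by a temperature) and softmax over the candidate captions.
logits = 100.0 * txt_embs @ img_emb
probs = np.exp(logits - logits.max())
probs /= probs.sum()

best = captions[int(np.argmax(probs))]
print(best, probs.round(3))
```

Because the classifier is just "whichever caption scores highest," any bias in how the text encoder represents words like gendered or age-related terms flows directly into the predictions, which is what the audit probes.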